Syntax-enhanced semantic parsing with syntax-aware representation
XIE Defeng, JI Jianmin
Journal of Computer Applications    2021, 41 (9): 2489-2495.   DOI: 10.11772/j.issn.1001-9081.2020111863
Syntactic information, i.e., the constituent-structure or dependency relations between the words of a sentence, is an important and effective source of guidance in Natural Language Processing (NLP). The task of semantic parsing is to transform natural language sentences directly into semantically complete, machine-executable meaning representations. Previous work on semantic parsing has made little use of source-side syntactic information to improve end-to-end models. To further improve the accuracy and efficiency of end-to-end semantic parsing, a method was proposed that exploits the source-side dependency information of syntax. First, an end-to-end dependency parser was pre-trained. Then, the intermediate representation of this parser was taken as a syntax-aware representation and spliced (concatenated) with the original word embeddings to produce a new input embedding, which was fed into the end-to-end semantic parsing model. Finally, model fusion was carried out by transductive fusion learning. In the experiments, the proposed model was compared with a Transformer baseline and with related work from the past decade. Experimental results show that, on the ATIS, GEO and JOBS datasets, the semantic parsing model integrating the dependency syntax-aware representation and transductive fusion learning achieves the best accuracies of 89.1%, 90.7% and 91.4% respectively, outperforming the Transformer baseline and verifying the effectiveness of introducing the dependency relation information of syntax.
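As a rough illustration of the splicing step described in the abstract, the PyTorch-style sketch below concatenates a frozen parser's hidden states with ordinary word embeddings to form the parser's input embedding. All module names and dimensions here are illustrative assumptions, not the paper's actual configuration.

import torch
import torch.nn as nn

class SyntaxAwareEmbedding(nn.Module):
    # Splice a pre-trained dependency parser's intermediate states
    # ("syntax-aware representation") with the original word embeddings.
    # Minimal sketch; sizes are assumed, not taken from the paper.
    def __init__(self, vocab_size, word_dim=256, syntax_dim=256, model_dim=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Project the concatenation to the semantic parser's model dimension.
        self.proj = nn.Linear(word_dim + syntax_dim, model_dim)

    def forward(self, token_ids, syntax_repr):
        # token_ids:   (batch, seq_len) source-side token indices
        # syntax_repr: (batch, seq_len, syntax_dim) frozen parser states
        spliced = torch.cat([self.word_emb(token_ids), syntax_repr], dim=-1)
        return self.proj(spliced)  # input embeddings for the end-to-end model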
Adversarial negative sample generation for knowledge representation learning
ZHANG Zhao, JI Jianmin, CHEN Xiaoping
Journal of Computer Applications    2019, 39 (9): 2489-2493.   DOI: 10.11772/j.issn.1001-9081.2019020357
Knowledge graph embedding maps the symbolic entities and relations of a knowledge graph into a low-dimensional continuous vector space. Although training knowledge graph embedding models requires negative samples, most knowledge graphs store only positive examples, in the form of triplets. Moreover, the negative samples produced by the negative sampling of conventional knowledge graph embedding methods are easily discriminated by the model and contribute less and less as training proceeds. To address this problem, an Adversarial Negative Generator (ANG) model was proposed. The generator adopted an encoder-decoder pipeline: the encoder read in a positive triplet whose head or tail entity had been replaced, as context information, and the decoder then filled in the replaced entity of the triplet using the encoding provided by the encoder, thereby generating negative samples. Several existing knowledge graph embedding models were used to play an adversarial game with the proposed generator to optimize the knowledge representation vectors. Comparisons with existing knowledge graph embedding models show that the proposed method achieves better mean rank on link prediction and more accurate triplet classification results on the FB15K237, WN18 and WN18RR datasets.
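The encoder-decoder generator can be sketched as follows: a triplet with a masked head or tail entity is encoded as a three-token sequence, and the decoder scores every entity as a candidate replacement for the masked slot. This is a hedged reading of the pipeline; the layer types, sizes, and the MASK convention are assumptions for illustration, not the paper's implementation.

import torch
import torch.nn as nn

MASK = 0  # assumed reserved entity index marking the replaced slot

class ANG(nn.Module):
    # Encoder-decoder negative-sample generator, sketched after the ANG idea:
    # read a positive triplet whose head or tail is masked, then emit a
    # distribution over entities with which to fill the masked slot.
    def __init__(self, n_entities, n_relations, dim=128):
        super().__init__()
        self.ent_emb = nn.Embedding(n_entities, dim)  # index 0 doubles as MASK
        self.rel_emb = nn.Embedding(n_relations, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.Linear(dim, n_entities)

    def forward(self, heads, rels, tails):
        # heads/rels/tails: (batch,) index tensors; the replaced entity
        # position carries the MASK index as context information.
        seq = torch.stack([self.ent_emb(heads),
                           self.rel_emb(rels),
                           self.ent_emb(tails)], dim=1)  # (batch, 3, dim)
        _, h_n = self.encoder(seq)             # final state encodes the triplet
        logits = self.decoder(h_n.squeeze(0))  # (batch, n_entities)
        # Sampling from softmax(logits) yields hard negatives; in training the
        # generator plays an adversarial game with the embedding model, which
        # tries to assign low scores to the generated triplets.
        return logits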
